Minimum entropy queries for linear students learning nonlinear rules

Author

  • Peter Sollich
Abstract

We study the fundamental question of how query learning performs in imperfectly learnable problems, where the student can only learn to approximate the teacher. Considering as a prototypical scenario a linear perceptron student learning a general nonlinear perceptron teacher, we find that queries for minimum entropy in student space, i.e., maximum information gain, lead to the same improvement in generalization performance as for a noisy linear teacher. Qualitatively, the efficacy of query learning is thus determined by the structure of the student space alone; we speculate that this result holds more generally for minimum student space entropy queries in imperfectly learnable problems.
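As a hedged illustration of the query-selection principle described above, the sketch below implements minimum student-space-entropy (maximum information gain) queries for a linear perceptron student, assuming a Gaussian prior over the student weights and a Gaussian noise model so that the posterior over weights stays Gaussian. Under these assumptions the information gain of an input x has the closed form (1/2) log(1 + xᵀA⁻¹x/σ²), where A is the posterior precision matrix, and over inputs of fixed norm it is maximized by the eigenvector of A with the smallest eigenvalue. All names and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal sketch (not the paper's code): minimum student-space-entropy queries
# for a linear perceptron student trained on a nonlinear perceptron teacher.
# Assumes a Gaussian prior over student weights and a Gaussian noise model, so
# the posterior over the weights is Gaussian with precision matrix A.

rng = np.random.default_rng(0)
N = 20                                   # input dimension
sigma2 = 0.2                             # assumed noise variance of the student's Gaussian model
w_teacher = rng.standard_normal(N)

def teacher_output(x):
    # A generic nonlinear perceptron teacher (tanh transfer function here).
    return np.tanh(w_teacher @ x / np.sqrt(N))

A = np.eye(N)                            # posterior precision (prior = identity)
b = np.zeros(N)

def min_entropy_query(A, radius):
    # The posterior entropy is (1/2) log det(A^{-1}) + const.  A new example x
    # reduces it by (1/2) log(1 + x^T A^{-1} x / sigma2), which on the sphere
    # |x| = radius is maximized by the eigenvector of A with smallest eigenvalue.
    _, eigvecs = np.linalg.eigh(A)       # eigenvalues returned in ascending order
    return radius * eigvecs[:, 0]

for _ in range(50):
    x = min_entropy_query(A, radius=np.sqrt(N))
    y = teacher_output(x)
    A += np.outer(x, x) / sigma2
    b += y * x / sigma2

w_student = np.linalg.solve(A, b)        # posterior mean of the linear student
```

Each query thus probes the direction in which the student's weights are least determined; the teacher's nonlinearity enters only through the training labels.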


Similar articles

Learning unrealizable tasks from minimum entropy queries

In supervised learning, learning from queries rather than from random examples can improve generalization performance significantly. We study the performance of query learning for unrealizable tasks, where the student cannot learn the teacher perfectly. As a simple model scenario of this kind, we consider a linear perceptron student learning a general nonlinear perceptron teacher. Two kinds of q...


Learning from queries for maximum information gain in imperfectly learnable problems

In supervised learning, learning from queries rather than from random examples can improve generalization performance significantly. We study the performance of query learning for problems where the student cannot learn the teacher perfectly, which occur frequently in practice. As a prototypical scenario of this kind, we consider a linear perceptron student learning a binary perceptron teacher....



An On-line Adaptation Algorithm for Adaptive System Training with Minimum Error Entropy: Stochastic Information Gradient

We have recently reported on the use of the minimum error entropy criterion as an alternative to mean square error (MSE) in supervised adaptive system training. A nonparametric estimator for Renyi's entropy was formulated by employing Parzen windowing. This formulation revealed interesting insights about the process of information-theoretic learning, namely information potential and informatio...
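As a rough illustration of the stochastic information gradient, the sketch below trains a linear combiner by ascending a Parzen-window estimate of the information potential (equivalently, descending Renyi's quadratic error entropy) over a sliding window of recent errors. The Gaussian kernel, window length, kernel width, and step size are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    # Gaussian Parzen kernel used in the nonparametric Renyi entropy estimate.
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def sig_train(X, d, L=10, sigma=1.0, eta=0.05, epochs=5):
    """Stochastic information gradient (SIG) sketch for a linear combiner.

    Minimizes Renyi's quadratic error entropy (equivalently, maximizes the
    Parzen-window information potential) over the L most recent errors.
    Note: error entropy is invariant to the error mean, so any output bias
    would have to be fixed separately after training.
    """
    n_samples, n_inputs = X.shape
    w = np.zeros(n_inputs)
    for _ in range(epochs):
        for k in range(L, n_samples):
            e = d[k - L:k + 1] - X[k - L:k + 1] @ w   # errors e_{k-L}, ..., e_k
            de = e[-1] - e[:-1]                       # e_k - e_i
            dx = X[k] - X[k - L:k]                    # x_k - x_i
            # Ascend the stochastic estimate of the information potential:
            # dV/dw = (1/(L sigma^2)) * sum_i G_sigma(e_k - e_i) (e_k - e_i) (x_k - x_i)
            w += eta * (gaussian_kernel(de, sigma) * de) @ dx / (L * sigma**2)
    return w
```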


Independent component analysis by general nonlinear Hebbian-like learning rules

A number of neural learning rules have been recently proposed for independent component analysis (ICA). The rules are usually derived from information-theoretic criteria such as maximum entropy or minimum mutual information. In this paper, we show that in fact, ICA can be performed by very simple Hebbian or anti-Hebbian learning rules, which may have only weak relations to such information-theo...
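As a hedged sketch of such a rule, the example below runs a one-unit, cubic Hebbian-like update on whitened mixtures, batch-averaged over the data for numerical stability. With the cubic nonlinearity, the Hebbian sign (+1) favours sources of positive kurtosis and the anti-Hebbian sign (-1) sources of negative kurtosis; the whitening step, nonlinearity, and learning rate are illustrative choices rather than the specific rules analyzed in the paper.

```python
import numpy as np

def whiten(X):
    # Center and whiten the observed mixtures (standard ICA preprocessing).
    X = X - X.mean(axis=1, keepdims=True)
    vals, vecs = np.linalg.eigh(X @ X.T / X.shape[1])
    return vecs @ np.diag(vals**-0.5) @ vecs.T @ X

def hebbian_ica_unit(Z, g=lambda u: u**3, sign=+1, eta=0.1, n_iter=300, seed=0):
    # One-unit Hebbian-like ICA rule on whitened data Z (components x samples):
    #     w <- w + sign * eta * E[ z * g(w.z) ],  then renormalize to |w| = 1.
    # With g(u) = u^3 this ascends (or descends) a kurtosis-based contrast, so
    # the appropriate sign depends on the source statistics.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        w += sign * eta * (Z * g(w @ Z)).mean(axis=1)
        w /= np.linalg.norm(w)
    return w

# Illustrative use: two super-Gaussian (Laplacian) sources, random mixing.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 5000))
Z = whiten(rng.standard_normal((2, 2)) @ S)
w = hebbian_ica_unit(Z, sign=+1)   # w @ Z should recover one source up to scale and sign
```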




Publication date: 1995